
Stanford University - Investigating Facebook | C-SPAN | March 9, 2019, 8:00pm-9:27pm EST

the courts, and public-policy events in washington, d.c., and around the country. c-span is brought to you by your cable or satellite provider. next, producers of the pbs documentary "the facebook dilemma" talk about their investigation into the platform. then, a 2014 interview with congressman ralph hall on his career in the house. after that, the alaskan congressman is lauded for being the longest-serving republican in u.s. history. now, producers of the pbs documentary "the facebook dilemma" talk about their investigation into the social media platform. stanford university hosted this event. [applause] >> tonight's symposium follows
the mantra think globally, act locally. sitting here, we are five miles from the headquarters of facebook. this social media platform has great local impact. students in the area strive to work there, and faculty offer the company both consulting insights and criticism. the decisions made by facebook echo across the world, affecting the platform's more than 2 billion monthly active users. pbs aired a two-hour documentary that investigated the company's impact on democracies across the globe. tonight we are proud to have three people who explored the story behind the story, including the particular challenges in covering a powerful social media platform.
after brief introductions, we will turn to a panel discussion about how "the facebook dilemma" came to be and what takeaways these journalists have from their experiences putting together the documentary. we provided audience members with notecards. as questions occur, please feel free to write them down. they will be collected and forwarded to me so that for the last half of the panel we can focus on questions from you. now, for the introductions. anya is an award-winning producer and journalist who joined frontline's independent journalism group in 2014, having begun her career as an assistant producer. in between she spent nine years at 60 minutes, working on stories that ranged from the violence in mexico to the destruction of coral reefs, to the lack
of accountability for prosecutors accused of misconduct. she graduated from the university of california and columbia university's graduate school of journalism. james is a founding member of frontline. in addition to "the facebook dilemma," he recently produced "war on the epa," which investigated how scott pruitt went from fighting the agency to running it and rolling back years of policy. another of his films told the dramatic story of a guantanamo detainee released from the controversial u.s. prison after more than a decade. in collaboration with npr, the film illustrated the struggle over freeing prisoners once deemed international terrorists. while he worked for 60 minutes, his investigations revealed wrongdoing by major banks,
credit reporting agencies, disability lawyers, and arson investigators. prior to joining 60 minutes he worked for cnbc and the nation. he reported on a range of topics, from youth politics in pakistan to the european debt crisis, to the rebuilding of new orleans after hurricane katrina. james has received several honors for his work; he is a graduate of the university of pennsylvania. dana priest has been a reporter for the washington post for 30 years. she covers mostly national security issues and has been a reporter and contributor to pbs and a contributor to nbc, cbs news, and 60 minutes. priest has received numerous awards, including pulitzer prizes in 2006 and 2008.
she is the author of two best-selling books, "the mission" and "top secret america: the rise of the new american security state." she is also the cofounder of pressuncuffed.org. thank you all, and thank you to the audience for coming here on what has turned out to be a dark and stormy night in palo alto. james, where did the idea for this documentary come from and how did you initially start your reporting process? james: it initially came -- anya and i were trying to remember it this morning. we were working on the epa film out here in the bay area. we were recognizing there was a story to tell -- this was after
the 2016 election. there was a lot of talk about russian interference and all sorts of security concerns. what happened was our epa film aired in october. in november of 2017, the general counsels from facebook and some of the other tech companies appeared before congress. that was really the moment when anya and i decided this was something we needed to look into, in part because of the non-answers from the attorneys from the companies at that point about what had happened during the election. we just thought it was good to look into. >> what was the first thing you did? james: the first thing you do any time is read up as much as you can. you ask around about who some of the smartest people in the field are to talk to, both critics of these companies
as well as people that have worked for them. anya is really an expert at finding current and former employees to speak to. you really speak on background at first and get the lay of the land. it was really to create a database of people that we could reach out to and have conversations with. >> i know when you walk in, you are greeted with a smile and a nondisclosure agreement. employees sign these and company lawyers often police them. how did you get people who signed nda's to speak with you, and did lawyers get involved? anya: this was probably the most difficult story -- i had a hard time getting people to speak, even off the record. people were incredibly nervous.
there was a lot of networking, a lot of going around and talking to one person, and they would introduce you to somebody else. people seemed very reluctant to go on camera or even give their name. it is a tight community and people were worried about appearing to be critics of their previous employer. >> the other reluctance from the nda's is about naming specific people or talking about specific events. people that ended up speaking on the record really didn't want to talk about anything specific, a specific person especially. specificity may have been a trigger for some of the agreements. there were a few people that were kind of unconcerned with the nda's.
anya: most people were very different. >> did you ever get the sense that somebody was telling people not to talk to you? >> yeah, definitely. someone at the company, not necessarily lawyers. >> that is a great transition. dana, you have covered intelligence agencies. how did reporting on facebook compare to what you reported on in intelligence? dana: it rang all the alarms. they were more worried than the cia people, the people who worked at the agency, who spend their whole lives in a classified arena. i found that these people had much less to lose, because there was no way someone who started talking to us would be
legally prosecuted; that would bring so much publicity. yet they were culturally so scared. i had already started looking at facebook for the post, talking to an intermediary, phil bennett, who brought me into the project. just from that experience, the comparison between talking to cia people about classified information, dod people about operations, and here i was in civilian america having this very strange experience. part of it was that a lot of these people were very young. they were probably scared to death. how does a company that claims to be a happy, community-based
company that is positive still instill that in people? it rang every alarm bell. of particular interest was what was happening overseas. i had done a little bit of work on that. basically, i followed some basic rules for stories in journalism. always follow the money, which we didn't do. that is a tried and true mantra. the other is, who knew what when? that is one of the main things we started to pursue. once we got our hands around a little bit of what we were going to do, we wanted to ask who knew what when. we drew that out in the foreign area. for people who aren't journalists, the one thing i would like to explain is that even though a film or article may come across as authoritative,
usually you do not start with a and get to b. it is a squiggly line to get to the end. that is a process i think every story follows. james: i think when dana joined us, in part what she brought to it was thinking about the who, what, when aspect. we were going to have to go chronologically in the film. one of the interests was who was talking internally and externally at the company, warning about all sorts of problems that then reared their head and ended up on the front pages after the 2016 election and onward, whether it be privacy concerns or concerns about malicious actors.
it was a really helpful way of structuring our reporting, to fill in chronologically what was happening in terms of insiders and outsiders talking about some of the issues that ended up being major problems. >> tv works best with pictures, and pictures need a narrative arc. as a storyteller, what choices did you make to visually convey how the company was operating and what its impacts were? james: we had a phenomenal archival coproducer on the film, megan robertson, who is an expert at finding footage. we were really kind of given the mandate by our executive producer to think of archival footage as investigative reporting, where you are digging
into the archives to see what it was that the principals, whether it was mark zuckerberg or others, were saying about particular issues, so that we could investigate or test whether that was happening inside the company. megan relied at first on something that was from the university of wisconsin, the zuckerberg files. it was a research professor who was assembling everything mark zuckerberg had said since he basically became a public figure in 2004-2005, whether in video, audio, or print, and chronologically. where there were lots of holes in the archive, megan would then research that. basically, that was one way to bring this to life. >> in setting up new interviews, i know at both 60 minutes and frontline you have been able to
figure out some way to get people to talk about extremely controversial topics. i wondered, what happens when they regret what they said? >> the main way i think we get people to do interviews is by being genuinely interested in their side of the story. the comment to everyone is that it would be better to know your side of what happened; if there is something you are worried about, tell us so we can really understand the full scope. we have had instances where people are not happy with their interviews. dana: there were a couple of people during the interviews who wanted to say something off the record. they would either say stop the cameras or something. even if they didn't say stop the cameras, we would of course honor that. we said in the beginning,
if there is something you don't want to have on camera, that is fine. part of the process is figuring out what the story is. there is a lot of wanting to figure out what the story is, and the second thing is what you can get on camera once you figure that out. they are not always the same. knowing what the story is is the most important thing. you can always find ways to somehow get it on camera or into the narrative. the first half of the film, for those who haven't seen it, is really largely about the business model that changed something we knew best, zuckerberg's site connecting you with kids, family, grandchildren and all that, into a multinational corporation that is everywhere and so dominant.
we even had to sort of train ourselves, and it didn't really work all the time, to call them a corporation. we actually never do that in the film. i remember trying it in the script and it felt so weird. they were so good at telling you they were something else. for those of you, including myself, who are not huge users and still didn't understand the business model, that is what we wanted to explain. when you go and become a public company, no matter what company you are, the dynamic of the company becomes the same. you have to please your stockholders. you do that by making more money. that is where the business model starts to change. in the end, it ends up doing
some bad things, unethical things. we do like three-hour interviews with people. they are very meandering and we don't always know what we are looking for. sometimes we stumble into things we weren't anticipating talking about at all. often in the middle of interviews, people will say we are running up on something that i feel nervous talking about, or we are in an arena where we want to respect people's boundaries. some people will call us afterwards and say, i am not comfortable with something i said. we will always hear them out and try to understand what it was, whether they said something they think will put them in danger, or something they are embarrassed about. in one case, someone from facebook was very unhappy with their interview. that was difficult.
we really had to listen to their concerns. at the end of the day, they kind of decided they weren't properly prepared for the interview. that was on them, i think, is what we felt. we had properly prepared the public affairs people, telling them what we wanted from these few days at facebook. james: one last thing i would add is that frontline is somewhat unique in that we have the transparency project. for most if not all of the interviews that we did, we published the entire transcript of the interview online. you can watch the entire interview and read the entire transcript. i think the reason we do that in
part is because we feel like it is a public record and it is important for the public to know, and to judge how we have, we hope, properly characterized someone's story as he or she told it. it also defends us very well against allegations of being fake news in some way. it is a strange position for filmmakers and journalists to do that. no one really does that. it is really important for frontline to do that as a public service. people have been unhappy with how they have been edited and things like that; that happens occasionally. i think we generally do a good job of being fair and hearing people out and properly characterizing them. that is what we do.
>> their transcripts are out there. there are 29 of those, and one thing that i think is interesting is you have highlighted in the transcript and in the interview what you used in the actual documentary. as a learning project, for a look at the editorial judgment, you could see how they selected 30 or 40 seconds. the favorite one i watched was with president trump's current campaign manager, and i wanted to ask you, what is it like to interview somebody who says directly to your face that the press are the enemy of the people? james: he tempered that to some degree after we had an exchange about it. it was odd that we actually had that exchange in an interview about facebook. that is really what the interview was about:
how the trump campaign used facebook as an advertising tool, and also what had happened, how he responded to the idea that there was a disinformation campaign that may have helped his candidate. we got into a discussion about the enemy of the people charge and he dialed it back and said not all the press is the enemy of the people. he feels as though -- it is a long and laborious exchange that happens to be out there. what is weird about it is that when you put your transcript out there and you are transparent about it, on youtube, the vitriolic comments about the interview, as if it was some sort of battle between me and brad parscale, are kind of astonishing. there were a lot of comments,
predominantly comments about left-wing media challenging him. it is a strange thing to have your transcripts out there. the ethic is fantastic. people can judge for themselves. dana: speaking of the transcripts, which you can see at frontline.org, there are two really interesting people who are not in our film, who are very thought-provoking, and i would recommend looking at them. one is the privacy expert at facebook. he had previously worked -- he was a phd in computer science, i think. his explanation of what he thinks of as privacy is very different from what probably most of us would think of. it is the right to not have
your data taken in order to manipulate you when you do not know it. it is not just someone knowing where i live or my social security number; it is the idea of unknown manipulation. it is very much worth reading. the other one is donald graham, the former owner of the post. we got him to -- we tried to get him to talk about zuckerberg. he couldn't really tell us much that was new or interesting. what is interesting is what he said about who should be the regulator of speech, and do you really want the government to do this? or do you want private industry that is somewhat reactive to
public demand to do it? i think he makes a very good counterargument to those who say it should be the government. in that same week he was talking, vice president pence was dissing google, i think it was, claiming that their algorithms were tweaked towards the left. he pointed to that. those would be the two i would recommend that are not in the film. >> one person's responsible operation of a social media platform is another person's censorship. i wonder how you view facebook as a global decision-maker about the elevation or suppression of content that could excite violence or create social division? james: that is the dilemma. we titled it "the facebook dilemma." we had a conversation with the former chief of security for
the company. basically, better than anyone, he sort of laid out the dilemmas. be careful what you wish for. you have an incredibly powerful internet platform here. you have other incredibly powerful internet platforms. what happens if you draw out a scenario where they become active in regulating speech, and what happens further down the line when there are ai tools? you could end up with quite a scenario when it comes to political speech. think about leadership at these companies. as critical as the film may have been, or comes across, when it comes to mark zuckerberg, imagine someone else in charge further down the line who is under no obligation to keep a "neutral"
political platform. what if there were internal decisions that were in fact biased? that is another scenario that is pretty frightful. i don't know the solution to that. i think that is something that smart minds at campuses like stanford should be discussing: where do we want to go? i think it is a question that facebook itself takes really seriously. they have been reluctant to exercise power because once you take responsibility for speech, you kind of own it. it was not just good business sense in some way; there was also a philosophy behind it. i am quite afraid of the alternative, of what he and
others talked to us about and where this could lead. these are basically unaccountable companies that have tremendous power over the mediation of speech in our society. one thing is for sure: there needs to be some transparency in order to judge what they are doing and how well they are doing it. >> you talked about our society; in the film you present evidence that facebook's operations destabilized democracies in myanmar, the philippines, and ukraine. i'm wondering if you have seen evidence that the company has developed the structure and expertise to mitigate that in the future? >> they certainly want you to think they have. i don't think that is the case, honestly. the countries that you named are having the worst problems, and they are also expanding at the
same time. and they are relying on local partners. they are small ngos; sometimes they are news organizations that are struggling to be news organizations. all of a sudden, they get a contract from facebook to be the -- i call them censors, they call them content monitors. recently, i was at a journalism conference in santa fe, new mexico. we had 28 journalists from all over the world, including one from mongolia. this woman from mongolia was telling me their problem with facebook, which she described as exactly the problem the ukrainians had on facebook. it had to do with a lot of
people who were anti-democratic trying to suppress the pro-democracy voices by complaining to facebook about something they were seeing that was hate speech, facebook not having the capacity, because of the language problem, to really know if it was hate speech or not, and buckling under the pressure of a lot of complaints. in mongolia, this woman described exactly that. it continues to expand and doesn't yet have the capacity. one of the things we should say is that facebook did open up to some extent. the two most interesting things: they let us sit in on two meetings. one was about content moderation. they now have slur lists for
every country, or they are developing them in every country they operate in. they recognize those lists change all the time given the context of the slur. a slur today might not be a slur tomorrow. these are young people who are trying earnestly and hard to figure out how to do the right thing. this is a huge issue. to be sure, we saw there was a huge amount of resources devoted to it. >> you are local. have you seen a reaction from people at facebook to the film? have you talked to the people that you talked to? anya: it is a strange experience because afterward it is very quiet. it is nice to be here tonight because i am looking forward to hearing what people thought. i don't get a lot of reaction from
some of the people in the film and some of the people who were helpful along the way. there is surprisingly little feedback. >> it has done well in terms of streaming, right? >> one interesting anecdote, which was an anecdote from someone who is there still, was that a lot of the younger employees take the bus to menlo park from the city every day. on the buses, people were watching in the days after the film aired. there were a lot of questions internally, in part because institutional memory, especially when you have younger employees, is short. a lot of the engineers and product designers and other people who worked there were not necessarily familiar with some of the history that we told,
what was discussed at the company about privacy concerns or other things. i heard from someone that there were a lot of people internally who were watching it and had serious questions to ask internally about what was known, and who really wanted to get a good sense of the history of the company and how it had approached different problems, like the speech problem. in 2008, the company was enormous. we tell the story in the film, the story of people sitting down to come up with what is essentially a constitution for a nation-state. what speech will be permitted? that is the first element of creating a way to regulate this community online. what was the ethos about that,
what was the thinking. some of the younger employees didn't know any of that history. that is great if they are learning something. >> since your film, the "new york times" has done additional reporting. i wonder, for each of you, what would you like to do in your next reporting on facebook? what do you want to know? dana: i still want to know who knew what when about some of these crucial issues. really -- it is very different from the "new york times" reporting. we didn't get into the leadership and their role in decision-making. that i think is critical, because we are at a stage right now where we have big choices to make about not only facebook but other tech companies that have
become so monopolistic that they are crushing any competition and innovation. to me, it is like the industrial revolution, and reforms eventually came in. we are at that phase where we are maybe not quite there yet, but at some point, people will have to start deciding how much power they will really give to them, now that they know how much power they have. the problem is that knowing how much power they have is just coming to light now. >> that is interesting. through a dual stock structure that gives his shares bigger voting power, mark zuckerberg is the controlling shareholder in facebook. as a shareholder, he could pursue policies that do not maximize profits. the annual report reminds people of that. he could favor trading off revenue. did you see any evidence of
altruism or civic participation in how mark zuckerberg is leading facebook? >> yes is the short answer. there are a lot of well-meaning people inside of facebook. there is a department that is really thinking about how, as a tool, it could help with all sorts of issues. one of the things that we have to remind ourselves of is that this is still a very good service for people. it is something that we use, and many of us do. i think in emergencies it can be very useful: fundraising tools, getting information out quickly to networks. the thing i think, though, is
that when it came to investments in security, when it came to investments in the parts of the company that really deserved more attention, and there were people internally saying they needed more attention and more resources, i think they made some really bad decisions. the interesting thing is they are saying they are investing in that stuff. we will see what that means when it comes to protecting elections from disinformation campaigns, bringing down fake accounts, things like that. we had the midterms; we need to see how things go further into the future. that is one thing we are interested in. i think the other thing is whether or not -- there are major questions still to be asked about the company's size and data.
the issue of antitrust hasn't been addressed yet in a really significant way, thinking about data differently and thinking about trust differently, and what this company has and how well it is doing in terms of whether it benefits more than its actual consumers. >> one of the most fun things about collaborating with intelligent journalists is the after-hours talk. we get obsessed with the story. first we get obsessed with, do we understand what it really is and all of its elements? we can talk late at night through what is more important, this element or that element, imagining kind of what the solutions to these problems are. we don't really address that in the film. it is something we talked about; we couldn't help but talk about it. one question is, would it be
possible to have a nonprofit facebook? what would that look like? would that actually solve the problem? are the problems created because of the drive to make the most profit possible? what if you took money out of the equation? could you actually have what we like about facebook? i think that is still the idea, that you can connect with your friends and family. >> and you, anya, any reporting hopes? anya: i am most interested in the same questions. it is hard to make a film about that. i think that is one of the challenges in the process. when it comes down to storytelling, it is hard to tell a story about the future. i think those are the things that are most important and are
most interesting. the thing is, the scale of the problems is what is so mind-numbing. we saw that when we went to those meetings. my favorite line was, we were filming this and there was a team dedicated to trying to fix the fake news problem, and someone yelled out, we need a fact-checker for the middle east. as someone who has fact-checked, that is a crazy thing to say. trying to fact-check a two-hour film takes weeks. they do care. i was genuinely very surprised by the impact the company has had around the world. trying to solve these problems is overwhelming. >> if you do have questions, please raise your hand and we will collect them. i will ask one more round.
>> dana, in your role at the university of maryland, you actually teach classes and do research about press freedom and misinformation. what lessons do you wish facebook managers would take to heart? dana: that in some countries that are not democratic, they are facilitating nondemocratic forces. what happened with the arab spring was that the dictators realized, we have to figure out this thing. that was a huge wave that extended far beyond the middle east, and they said we will never let
this happen again. all of those authoritarians learned what social media, including facebook, could do for them. i don't think facebook necessarily kept abreast of that. i teach an advanced reporting class once a year where i give a student an imprisoned journalist they have to do an intimate profile of. they have to find family, colleagues, and all of that. i have learned a lot about the people who are imprisoned all around the world. the regimes that are imprisoning them have all the cards. they have the keys to the kingdom. i don't know to what extent facebook realizes that they empowered them; maybe it was inevitable, but they facilitated that. they have some really
interesting decisions to make. vietnam is a small country, but it is a complete authoritarian regime. if they want to go into vietnam, they have to give them -- put the servers there and basically give them the names of the people using facebook. china wants the same thing. zuckerberg is enamored with going in there, although i haven't followed it recently. what will happen there? what will happen in places where they know their tools have been used for anti-democratic messages? what will they do about that? >> a significant number of our
journalism students are here tonight. i wondered what advice you would give to somebody who is very interested in accountability reporting? anya: i think if you are here, you should try to cover tech. this will be the beat for years to come. >> we have thought a lot about students going into computer science and working in silicon valley. if our students end up working at a social media company, what would you like them to take away from your reporting? james: one thing right off the bat is know your history. know the history of the company you're working for.
try to understand its prevailing ethos. one of the things that struck me was how little appreciation for history there was, the history of authoritarians using tools for bad things, the mindset of silicon valley and how that has changed as well. a lot of people in venture capital talked to us about the earlier days of silicon valley, and i guess the ethos was most embodied in a truer idealism about the powers of technology.
there have been major shifts towards a much more mercenary aspect of the valley. i just think that what really struck me most in reporting this was a lot of historical naivete among the people who we interviewed in these companies: not understanding security concerns, not understanding regulatory concerns when industries grow enormous. these are things that are basic to how people were trained. we heard earlier about this aspect of product design, whether it be the mental health aspects of it, the security problems that could be present. a lot of what we heard over the course of reporting is that various
divisions of the company weren't speaking to one another. there may have been product designers who were supposed to just keep shipping products, designing, programming, and they were not consulting other people who dealt with real-world problems. that is something that absolutely has to change. it is something that these companies should be mindful of and something young employees going into these companies should be mindful of: am i communicating with all the people who i could potentially communicate with that might understand how, practically, something i am designing could either go right or wrong? dana: we were stepping out of our hotel, and we were talking about what the analogy was. is the defense industry an analogy? what are the ethics? everybody who is a programmer,
developer, they should be thinking about it when they are building something that could be used for bad. when the first drone makers built the first drone, i'm sure they had no idea that it could be armed to kill people with pinpoint strikes all over the world. there are questions -- is it like medicine? what are the ethics of a doctor? what are the ethics of a lawyer? what are the ethics of someone who builds technology? >> as we talked about before, this winter stanford started a computer ethics class that has 300 students in it. it is taught by a political philosopher and a political scientist who served in the obama administration. there is a weekly writing requirement; some weeks that is code. there is actually a prerequisite
to get into it. it meets four times each week. it has in part been driven by people at stanford thinking about how we can broaden the things students take into account in the jobs that they hold. we will go to the unfettered lightning round portion of the event. i appreciate the questions, incredibly diverse. i will read the question and not guess who will answer it. just volunteer. first question: is there a google dilemma? [laughter] james: yes. sure, i think again, when it comes to the amount of data that google has on each of us, what they can do with that data, and in terms of algorithms and what
they surface in huge swaths of information, and what accountability mechanisms exist, there is definitely a dilemma. ostensibly we are getting it for free, but we know there is a price. we have to start knowing what that price is, and no one has figured that out yet. >> this question says nda's in california are one concern. isn't the real concern about employment blacklisting? in other words, will you be hired if you go public? dana: absolutely. that is a great way to put it. that was always implied. it was not just -- it was definitely blacklisting
employment wise, but blacklisting culturally among their tribe. these are their people; they speak their language. the fear of being expelled because you have broken a rule was almost, to me, as powerful as breaking some legalistic term. james: there were a lot of frustrating experiences in the reporting process. we would speak to a former employee of facebook whose real primary concern about not going on the record wasn't his or her nda -- anya: the current employer would frown upon it. james: people would frown upon it. in the next breath we would hear, what i'm telling you is really
important and you need to understand it. there is no other way for us to report it unless you help us report it. the bridge to cross was not necessarily an nda problem; it was much more of a cultural problem. there is a reluctance certainly in tech to talk to journalists, understandably, because it is technical. a lot of things are technical, complicated, not black-and-white, and we have to explain to the general public what they are doing. i think a lot of what people felt, or a reluctance to talk, may have been, will we get the nuance of it? i think we were more positioned to do it than any news organization because we had two hours to do this and we were publishing our transcripts. part of this is cultural in
silicon valley: you don't talk, you don't snitch. there is something we need to acknowledge on our end. tech reporting can often be wrong or can be problematic. you are taking really complicated things and you're trying to boil them down. on our part, there is certainly something we could probably do better to convince people to chat with us. dana: facebook spreads a lot of money around. you find formers in think tanks in washington or elsewhere. those think tanks get a lot of underwriting by facebook. that is another disincentive for someone out in a think tank to talk. there is a lot of criticism of news organizations and a lot of it is warranted.
many of the criticisms we are lodging, in fairness, are true about news as well. what you click on on facebook, it is the same problem. i think people are feeling really critical. it adds to their reluctance about going on the record, because they are afraid; those criticisms resonated for a reason. one thing we are interested in next: our journalism industry operates in facebook's world. it has been a salvation for the "washington post" and other commercial enterprises to find an audience. when you are operating as a journalist or an institution in that world where you can find
your audience, but you also may be playing to their bias to some degree, or that may be what comes up first, that is problematic. we have to do a better job as an industry of journalists and reporters in thinking through what it really means to operate when there are a few outlets and distributors, facebook being a major one. >> the next question relates to audience participation. raise your hand if you use facebook. that applies to the people on the panel too. the question is, what responsibility do users have, given the knowledge you have shared in your film?
it is not me, this is a question from the audience. >> certainly being media literate and knowing what is true and not true. at the end of the film, one of the smart critics of the industry, who is taken seriously in the valley for her critiques, basically states that dilemma. it is that yes, i have a problem with their business model, i have a problem with how they handled and addressed certain problems, but i also have a problem because my friends and family are still there. the network effect of this invention is tremendous, as a communication tool, as a place to share things. i think each of us has that same dilemma.
as users, there are a lot of ways -- i don't know if a user revolt will work. the numbers, whether true or not, are still quite high in terms of how many users they have. dana: the market overseas is 90% of the market. 90% of facebook's market is overseas. that is why the myanmar story is so wonderful, in a way: a closed society all of a sudden opens, and facebook ends up being the major communications method. there is no media literacy. there is no tradition of journalism or truth. it becomes a platform for abuse.
how much is facebook really liable for that? it is a more complicated question. >> it looks like people are turning to you for a lot of help. it says, from your research, what would satisfactory change look like at facebook? >> the idea of putting it on the company is not necessarily something that is going to work. there is a lack of regulation and a lack of policy. it is kind of like the principle of photosynthesis. incentivized as it may now be to invest in the right things and making it a more secure and safe place for users, it still does
have a market concern. i think we need to have a regulatory conversation. i think there are a lot of proposals out there. nothing substantive happens, in a way, without that. >> that is a great follow-up. the next question is, what are the most important government regulations you could put into place to curb facebook? these are different people; one is pencil, one is ink. james: there are a lot of good ideas out there right now. one idea that is particularly interesting to us right now is thinking about data differently, whether these companies should, in some way, be compensating users for their data, whether they need to disclose more about what they are doing with their data. whether or not we can -- in
a way, we are talking globally about putting a price on carbon emissions. we might need to start talking about putting a price on data and figure out what the cost of it is to us all as a society, that these companies have so much data about us, and what that means. that is just one idea. dana: my local recommendation would be to get your profile. get the information that facebook has about you. if you look for it, you will see the drop-down menu. on the bottom, request my profile. just click on it, fill in the form, and it used to take days; now it takes hours. you will see what facebook has on you and the associations, at
least some of the associations it has on you. it would give you -- my students were shocked, many of them, at how much it keeps and what associations it makes for you that you don't even make for yourself, necessarily. you may click no thank you, but it knows everybody who got that invitation, so you are associated with those people. it is just illuminating. i think people need to understand more -- i think the film is good for that. to understand more what is behind this, and what they know, and what other people are doing with that knowledge to get people not just to
buy products, but to make political decisions. >> the u.s. has the benefit that europe has taken the lead, for better or worse, on data privacy laws. i think a lot of people would say it is a flawed experiment for a lot of interesting reasons. compliance costs, which are very high, favor the larger platforms. it may end up exacerbating the monopoly problem that we have. we at least have the ability to study how that will play out and have a more intelligent conversation about what might work here, whether there is some sort of global regime that tackles this or if u.s. laws and regulation would do.
>> regulating media companies is fraught because of the first amendment. how do we get our minds around this content, and does that make it a media company? it may be a media company because it is the largest distributor of news now. what do you do with that? do you protect it in the same way? >> and if you are talking about public affairs, there may be important stories the market doesn't support but that help people in their role as citizens. if you look at the 80's or 90's,
industries had family or individual ownership. they got income for doing the right thing and contributing. and in silicon valley, you have companies that are publicly traded but the control rests with one person or, in the case of google, three people. >> a related question was, what do you make of facebook's $300 million commitment to news that was just announced? >> if you are cynical, it is a public affairs thing. i am not cynical. i don't think these are people who don't realize the harm that was created.
they probably do support real journalism. that 300, i know where some of it is going. it is going to locals, to report for america, reporters who are going to city newspapers -- the money is going to great things. i have looked at that recently. it is a good thing they are doing that. it doesn't really change the problems we're talking about. it is not a remedy for the revenue problem at news organizations that, in part, happened because of facebook and others, because the audience was on facebook
and facebook knew more about the users and readers of various publications. and so the whole revenue model of journalism was hugely affected by facebook. it sounds like philanthropy. what we need to do -- i don't know what the percentages are of income sharing from ad revenue on facebook when it comes to articles. of what the washington post publishes on facebook, i do not know what percentage of ad revenue is going to the post. i know at first it was miniscule compared to what facebook would take from it. that was their prerogative. it was also hugely detrimental to the news industry. we basically need to address the revenue problem in journalism. that is the important thing. >> and now for something
completely different. why do we care about anti-democracy content if that is the will of the users? >> that is a really interesting question and one that we talked about in various places. there are a lot of people, like people in the philippines, who support the president there. it is a question that i am not prepared to answer, but i certainly think it is part of the dilemma. it is one of the dilemmas that we are grappling with, trying to figure out how to report. dana: i have a much more black-and-white view of this. it starts with the fact that we are a democracy, and i think our national security is better served by other democracies, even though we have alliances that are tight with authoritarian regimes.
our whole foreign -- not our whole foreign policy, but our altruistic foreign policy and our goals as a country have always been to promote the rule of law and democracy. are we willing to change our big strategic goals? that is a huge difference in what our values up to this point have been. >> part of what the documentary is about and part of what facebook needs to grapple with is the issue of fake accounts and others that are magnifying sentiment or magnifying different messaging. that was the case in the philippines, for instance. there was a network of fake accounts
in order to make it seem that there was more support for his policies and also to impact critics. the issue of fake accounts is a big one, especially when it comes to amplifying anti-democratic messages. >> are they complicit in promoting propaganda in africa and helping others to suppress their own populace? >> it is not as if they want state propaganda on their platform; even that is something that they are grappling with. in myanmar, where the u.n.
said they facilitated genocide, they were trying to do something they weren't doing before, but they were not 100% or 50% successful yet. it is very much a work in progress. they have doubled the number of content moderators and other security people to combat these problems. the problem is, they grew so fast that it is almost -- i think it is almost impossible to get your hands around every problem that was created. that is both a testament to their success and how much people love facebook.
in every language and every dialect, here they are in palo alto trying to deal with it. >> i don't know the specifics in africa or in vietnam. the issue of how state-run media organizations or authoritarian states are using the platform will continue to be an issue for the company, and it is going to have to be something that we continue to report on. that is something that everyone kind of needs to get ahead of. >> as it is becoming part of the
discussion, one person in the audience wants to know if it would make a difference if facebook was involved in contacting officials to reduce regulation. >> not directly. that would be another documentary. >> it is happening with the internet association. most of the tech companies do not do pacs with their names on them. they don't lobby directly. they do it through a group called the internet association, which would be a great subject for a documentary because it is very cleverly done to be low profile. they have everybody covered in the legislature that may have anything to do with regulation. one of my favorite examples of this is that in their washington
office, which keeps a rather low profile -- i wish i could remember who. they have staffers from the republican and democratic side who were key staffers to key regulators. they now employ them in the washington office, you know, to do anti-regulation work. >> in the film, we speak to an early lobbyist who was there, who said it was absolutely part of the company's strategy, before it began writing checks, to ingratiate itself with politicians by helping them with their campaigns. this was the new place where campaigning would play out, a place to connect with the electorate. it is a smart way to do it. the company did outreach to politicians in order to make friends
and have that leverage to say, are you going to regulate us? it is much less likely, if they see facebook as a terrific tool for campaigning purposes, that they will regulate them. and that was, according to a former employee, an explicit part of the strategy. it has been effective in washington. one other former employee who told us about this was very technically minded. often, he would be the guy in the room that facebook or the internet association would turn to, and his line was that it wouldn't be
technically possible. to which the politician has nothing to say. so that has been another effective strategy: to say that what you are asking about is technically impossible. and that is something that was shared with us. there are all sorts of ways that they can exert their influence. it is not just writing checks, though perhaps that is changing. >> one person would like to know, was there ever a worry about a legal breakup of the company when you were interviewing people? were they fearful of antitrust? >> no. even though the company has undergone a lot of tough reporting, there is a little bit of
imperviousness, i would say, as to whether they are really getting a lot of the message. i am sure that their legal department is very concerned about the new york regulations. i am sure that they are in full effect, thinking through the proposals right now. among the people we are speaking to, the general sense is that probably nothing major is going to change. dana: we had a very interesting two days at facebook. we were open about what we were looking for, which was basically their story. having worked with the military on controversial subjects, if you disclose what the story is you are working on, it is more likely -- if it is a hit
job, you want to understand the complexities of the story -- if it is not a hit job and you want to understand the complexities of the story, they will give you their story. then you have an obligation to air it. we went in with a very sincere hope that they would tell us what it was like inside facebook right when the election happened, right when the russia thing broke, when the u.n. called them out about myanmar. tell us what that was like. it wasn't forthcoming. going back to the cia and the military, if you know that you have been given this opportunity, why aren't you taking it? one reason is that you just don't get it and you are in such a bubble
that you haven't learned the most basic public affairs lessons. or that you have something to hide. i was impressed, working on film and realizing how different it is from print, by trying to let the facebook people tell their case and make their best case for themselves. you see that the best case that they make is not an adequate case. they end up looking very much like we saw them. and we debated among ourselves, is that even fair? is it fair to put them out like they appear to us? right? because they didn't come off good. >> we show a montage of everyone saying the same thing. we really debated about that.
they insisted on no more than 30 minutes. not 31 minutes, 30 minutes. they were incredibly disciplined about that. we were not used to conducting interviews like that. it made everything very challenging. we had edited the first hour of the film and we were really struggling. these things are always a crisis. the whole second hour was supposed to be inside facebook. it would be their story. we had to film a whole hour of interviews based on what we collected. we left totally freaked out because we didn't get enough material to constitute an entire hour, and we didn't get enough insight to contain anything that would be interesting enough.
we had to rethink the whole way we were going to approach it. >> it was another indication that they are not worldly. they are in their bubble and don't understand the risks outside their bubble. >> the last question. i have a feeling that you are going out afterwards and will be talking about this event. one thing that happens is, i wish they had asked me x. what would you like the people here to know that we failed to draw out of you? this is the final exam. >> i am curious what we should do next.
i am interested in people who want to come talk to us and give us ideas. >> i am curious what the critique is of the film. i don't know how many people have seen it. whether it is people who have worked there or who know it intimately in some way, i am curious as to what a potential critique would be. i would like to be challenged on that. >> for me it is a little bit more probing of what it was like in the interviews. it was so fascinatingly different than any other place we have been, and i have been in so many different places. i kept having to check myself.
am i seeing what i feel like i am seeing? are they really coming off so naive? why are they coming off so naive? is it because i'm getting older and they are young? my students, or young journalists, my kids -- they don't come off as naive. can they all be putting out this shell because they are here? i don't really think so. there is a story i can't tell you, a security story, that symbolized it all. i don't think they understand the risks in the world, the world that they are operating in. it is like a government -- that
is not really the right term, but it is a super nation-state. they don't understand national security, goals, aims, and how they interact, how they interact with different countries. i am still dumbfounded about it. >> we do have professors grappling with that, including some teaching at stanford. they will be able to give you more nuanced answers. thank you for your stellar work and sharing the story behind the story. [applause]
[captioning performed by the national captioning institute, which is responsible for its caption content and accuracy. visit ncicap.org] [captions copyright national cable satellite corp. 2019] >> c-span's washington journal, live every day with news and policy issues that impact you. coming up sunday morning, susan harleigh discusses the wall street tax act of 2019, which would impose taxes on certain trading transactions including stocks, bonds, and derivative deals. and tim carney talks about his book, "alienated america: why some places thrive while others collapse." >> the only thing that we have to fear is fear itself. >> ask not what your country can do for you. ask what you can do for your country.
>> they will knock these buildings down, but -- >> c-span's newest book, "the presidents," provides insight into the american presidents through interviews with noted presidential historians, the challenges they faced and the legacies they left behind. published by public affairs, it will be on shelves april 23. you can preorder your copy today at c-span.org/thepresidents or wherever books are sold. >> congressman ralph hall passed away this last week at the age of 95. he served for 34 years and was
the oldest person to ever serve in the house. he was also a world war ii veteran. before leaving congress, he sat down with c-span for an interview about his life and career. this is 35 minutes. >> congressman ralph hall, you've been in the house of representatives since january of 1981, and you'd hoped to be here for one last term. the voters thought otherwise. how are you processing your departure? >> well, everything that i checked on during that time told me i was 10 to 12 points ahead. it told me one thing: don't depend on the people who tell you you're ahead when you're not. i really thought i had it won. coming back that night at 3:00 in the morning, i had to think as i was driving back out to my house how it happened. when i got home, i pulled out old elections and checked to see how i did there.
